UK’s largest police force spends over £200,000 on facial recognition trials that resulted in no arrests
Exclusive: Critics accuse Metropolitan Police of wasting public money on ‘dangerously inaccurate’ technology
Britain’s largest police force has spent more than £200,000 on controversial facial recognition trials that resulted in no arrests, figures reveal.
A freedom of information request by The Independent showed six deployments by the Metropolitan Police resulted in only two people being stopped, and then released.
Critics called the force’s use of facial recognition a “shambles” and accused authorities of wasting public money.
Despite the lack of arrests in London, the National Police Chiefs’ Council is considering drawing up national guidance on how the technology should be used.
Trials carried out between August 2016 and July last year saw 110 people’s faces registered as potential “alerts” against watchlists of wanted criminals.
The vast majority came at 2017’s Notting Hill Carnival, while the same event in 2016 saw just one alert amid accusations of racial bias.
There were seven alerts during a Remembrance Day service at the Cenotaph in November 2017, five at the Westfield Stratford shopping centre in June last year, one in the same location in July, and zero at the Port of Hull.
The Metropolitan Police has described the deployments as “overt” and said members of the public were informed by posters and leaflets that facial recognition was being used.
But no one questioned by The Independent after they passed through a scanning zone in central London last month had seen police publicity material, and campaigners claim the technology is being rolled out “by stealth”.
The Metropolitan Police spent £198,000 on automatic facial recognition software from Japan’s NEC Corporation between the 2015-16 and 2017-18 financial years.
In the same period, it paid £23,800 for hardware, including the cameras used to record people’s faces.
On top of the total spend of £222,000, the force has paid for uniformed and undercover police officers at each deployment but said the cost could not be precisely calculated.
Campaigners at Big Brother Watch, which has launched a legal challenge over the Metropolitan Police’s use of facial recognition, said the force was “wasting public money”.
Director Silkie Carlo said: “I think members of the public will be disappointed to see the police have spent over £200,000 in this shambles experiment playing with facial recognition and citizens’ liberties.
“The figures show, yet again, that this authoritarian surveillance is dangerously inaccurate and poses a serious risk to public freedoms.”
Hannah Couchman, a policy officer at Liberty, accused Scotland Yard of launching a “creeping expansion” of the technology without fully engaging with discrimination and human rights concerns.
“Facial recognition is a mass surveillance tool that can force us to change our behaviour and is least accurate when used to ‘identify’ women and BME people – meaning they are more likely to be subject to an intrusive police stop following a ‘false match’,” she added.
“While the Met continue to ignore legitimate human rights concerns, they do nothing to contradict what we already know – that this technology has no place on our streets.”
Automatic facial recognition software compares live footage of people’s faces to photos from a watchlist of selected images from a police database.
Any potential matches are flashed up as an alert to officers, who then compare the faces and decide whether to stop someone.
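The matching process described above can be illustrated in outline. The sketch below is a hypothetical simplification for illustration only: the Met’s NEC system is proprietary and its internal workings are not described in this article. It assumes faces are converted into numerical “embedding” vectors and compared by similarity, with an alert raised when a score crosses a threshold; the names, vectors, and threshold value are all invented.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two face embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def check_against_watchlist(live_embedding, watchlist, threshold=0.9):
    """Return the watchlist entries similar enough to trigger an alert.

    Alerts are only *potential* matches: as in the trials described,
    an officer would still compare the faces before stopping anyone.
    """
    alerts = []
    for name, stored_embedding in watchlist.items():
        if cosine_similarity(live_embedding, stored_embedding) >= threshold:
            alerts.append(name)
    return alerts

# Invented example data: two watchlist entries and one face from live footage.
watchlist = {
    "suspect_a": [1.0, 0.0],
    "suspect_b": [0.0, 1.0],
}
live_face = [0.99, 0.05]  # closely resembles suspect_a

print(check_against_watchlist(live_face, watchlist))  # ['suspect_a']
```

The choice of threshold is the crux of the accuracy debate: set it too low and innocent passers-by generate “false matches” of the kind campaigners criticise; set it too high and genuine matches are missed.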
A spokesperson for the Metropolitan Police said that a deployment in central London in December, after the period covered by the freedom of information request, resulted in two people wanted for violence offences being arrested.
“Tackling violent crime is a key priority for the Met and we are determined to use all emerging technology available to support standard policing activity and help protect our communities,” she added.
“The technology being tested in this trial is developing all the time and has the potential to be invaluable to day to day policing.
“We continue to constantly engage with key stakeholders and partners whilst this trial continues.”
Eight of the 10 planned trials have taken place, and a “full independent evaluation” is to be carried out.
Only one other force in the UK – South Wales Police – is currently using live facial recognition, while some others manually use software to comb the Police National Computer for matches to unidentified people.